Feature Adjustment in Kernel Space when using Cross-Validation

Authors

  • Anil Rao
  • Janaina Mourao-Miranda
Abstract

Kernel methods are a powerful set of techniques for learning from data. One of the attractive properties of these techniques is that they rely only on a kernel function, which provides a user-defined notion of similarity between two observations, to train the models. This report describes a strategy for evaluating kernel-based predictive models within a cross-validation framework when we also have a set of confounds that are used to ‘adjust’ the data based on their relationship with the data within the training sample. We show that for a linear kernel function, this adjustment can be performed in kernel space, which removes the need to reload or retain the data in memory during the cross-validation procedure. Furthermore, we show how a similar strategy can be used with other kernel functions, such as the Gaussian radial basis function.
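To make the linear-kernel case concrete, below is a minimal sketch of what such an adjustment can look like when ‘adjusting’ means regressing the confounds out of the features, with the regression coefficients estimated on the training sample only. Expanding the adjusted Gram matrix shows that it depends only on the original kernel matrix and the confounds, so the features themselves are never needed; the RBF case then follows by recovering squared distances from the adjusted linear kernel. This is a sketch under those assumptions, not the authors' implementation: the function names, the NumPy code, and the ordinary least-squares fit via the pseudoinverse are all illustrative choices.

```python
import numpy as np

def adjust_linear_kernel(K, C, train_idx):
    """Confound-adjust a linear kernel K = X @ X.T without access to X.

    Assumed adjustment model: X_adj = X - C @ B, where
    B = pinv(C[train]) @ X[train] is estimated on the training rows only.
    Expanding K_adj = X_adj @ X_adj.T gives an expression involving
    only K and C:

        K_adj = K - (C M) K[tr, :] - K[:, tr] (C M)^T
                  + (C M) K[tr, tr] (C M)^T,   with M = pinv(C[tr]).
    """
    tr = np.asarray(train_idx)
    CM = C @ np.linalg.pinv(C[tr])       # (n, n_train)
    K_tr = K[tr, :]                      # training rows of the full kernel
    return K - CM @ K_tr - K_tr.T @ CM.T + CM @ K[np.ix_(tr, tr)] @ CM.T

def adjust_rbf_kernel(K, C, train_idx, gamma):
    """Gaussian RBF kernel between the adjusted samples, obtained from the
    adjusted linear kernel via ||x_i - x_j||^2 = K_ii + K_jj - 2 K_ij."""
    Ka = adjust_linear_kernel(K, C, train_idx)
    d = np.diag(Ka)
    sq_dist = np.maximum(d[:, None] + d[None, :] - 2.0 * Ka, 0.0)
    return np.exp(-gamma * sq_dist)

# Usage inside one cross-validation fold: the full linear kernel is
# precomputed once; each fold only re-adjusts it with that fold's
# training indices, then trains on K_adj[train][:, train] and predicts
# with K_adj[test][:, train].
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 30))      # hypothetical features
C = rng.standard_normal((100, 2))       # hypothetical confounds
K = X @ X.T
train = np.arange(80)
K_adj = adjust_linear_kernel(K, C, train)
```

Under these assumptions the identity is easy to check numerically: adjusting the features directly, `X - C @ np.linalg.pinv(C[train]) @ X[train]`, and forming the Gram matrix of the result reproduces `K_adj` up to floating-point error.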

Similar articles

A Geometry Preserving Kernel over Riemannian Manifolds

The kernel trick and projection to tangent spaces are two choices for linearizing data points lying on Riemannian manifolds. These approaches are used to provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...

Nonparametric Tree Graphical Models via Kernel Embeddings

We introduce a nonparametric representation for graphical models on trees which expresses marginals as Hilbert space embeddings and conditionals as embedding operators. This formulation allows us to define a graphical model solely on the basis of the feature space representation of its variables. Thus, this nonparametric model can be applied to general domains where kernels are defined, handling...

Kernel discriminant analysis based feature selection

For two-class problems, we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of kernel discriminant analysis, called the KDA criterion. We show that the KDA criterion is monotonic under the deletion of features, which ensures stable feature selection. The second is the recognition rate obtained by a KDA classifier, called...

Learning Kernel Parameters by using Class Separability Measure

Learning kernel parameters is important for kernel-based methods because these parameters have a significant impact on the generalization abilities of these methods. Besides cross-validation and leave-one-out, minimizing upper bounds on the generalization error, such as the radius-margin bound, has also been proposed as a more efficient way to learn the optimal kernel parameters. In this ...

Input Output Kernel Regression: Supervised and Semi-Supervised Structured Output Prediction with Operator-Valued Kernels

In this paper, we introduce a novel approach, called Input Output Kernel Regression (IOKR), for learning mappings between structured inputs and structured outputs. The approach belongs to the family of Output Kernel Regression methods devoted to regression in feature space endowed with some output kernel. In order to take into account structure in input data and benefit from kernels in the inpu...

Journal title:

Volume  Issue

Pages  -

Publication date: 2017